
    A network epidemic model with preventive rewiring: comparative analysis of the initial phase

    This paper is concerned with stochastic SIR and SEIR epidemic models on random networks in which individuals may rewire away from infected neighbors at some rate ω (and reconnect to non-infectious individuals with probability α, or else simply drop the edge if α = 0), so-called preventive rewiring. The models are denoted SIR-ω and SEIR-ω, and we focus attention on the early stages of an outbreak, where we derive expressions for the basic reproduction number R_0 and the expected degree of the infectious nodes E(D_I) using two different approximation approaches. The first approach approximates the early spread of an epidemic by a branching process, whereas the second one uses pair approximation. The expressions are compared with the corresponding empirical means obtained from stochastic simulations of SIR-ω and SEIR-ω epidemics on Poisson and scale-free networks. Without rewiring of exposed nodes, the two approaches predict the same epidemic threshold and the same E(D_I) for both types of epidemics, the latter being very close to the mean degree obtained from simulated epidemics over Poisson networks. Above the epidemic threshold, pairwise models overestimate the value of R_0 computed from simulations, which turns out to be very close to the one predicted by the branching process approximation. When exposed individuals also rewire with α > 0 (perhaps unaware of being infected), the two approaches give different epidemic thresholds, with the branching process approximation being more in agreement with simulations.
    Comment: 25 pages, 7 figures
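    The SIR-ω dynamics described above can be sketched in a short simulation. This is an illustrative toy, not the paper's model: the discrete-time updates, parameter names, and graph construction below are our own simplifying assumptions (the paper works with continuous-time dynamics and also covers SEIR-ω and scale-free networks).

```python
import random

def poisson_graph(n, mean_deg, rng):
    """Erdos-Renyi G(n, p) graph; its degree distribution is
    approximately Poisson with mean `mean_deg`."""
    p = mean_deg / (n - 1)
    adj = {v: set() for v in range(n)}
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].add(v)
                adj[v].add(u)
    return adj

def sir_omega(adj, beta, gamma, omega, alpha, rng, seed_node=0):
    """Discrete-time sketch of the SIR-omega model: each step, every
    S-I edge is rewired away by the susceptible end with prob. omega
    (reconnecting to a random non-infectious node with prob. alpha,
    else dropped), otherwise it transmits with prob. beta; infectious
    nodes recover with prob. gamma.  Returns the final epidemic size."""
    state = {v: "S" for v in adj}
    state[seed_node] = "I"
    while any(s == "I" for s in state.values()):
        infectious = [v for v in adj if state[v] == "I"]
        for i in infectious:
            for s in list(adj[i]):
                if state[s] != "S":
                    continue
                if rng.random() < omega:          # preventive rewiring
                    adj[i].discard(s)
                    adj[s].discard(i)
                    if rng.random() < alpha:      # reconnect elsewhere
                        target = rng.choice(
                            [v for v in adj if state[v] != "I" and v != s])
                        adj[s].add(target)
                        adj[target].add(s)
                elif rng.random() < beta:         # transmission
                    state[s] = "I"
        for i in infectious:
            if rng.random() < gamma:              # recovery
                state[i] = "R"
    return sum(1 for s in state.values() if s == "R")
```

    Averaging the final size over many runs, with ω = 0 versus ω > 0, exhibits the protective effect of rewiring that the approximations above quantify.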

    Household epidemic models with varying infection response

    This paper is concerned with SIR (susceptible → infected → removed) household epidemic models in which the infection response may be either mild or severe, with the type of response also affecting the infectiousness of an individual. Two different models are analysed. In the first model, the infection status of an individual is predetermined, perhaps due to partial immunity, and in the second, the infection status of an individual depends on the infection status of its infector and on whether the individual was infected by a within- or between-household contact. The first scenario may be modelled using a multitype household epidemic model, and the second by a model we denote the infector-dependent-severity household epidemic model. Large-population results for the two models are derived, with the focus on the distribution of the total numbers of mild and severe cases in a typical household, of any given size, in the event that the epidemic becomes established. The aim of the paper is to investigate whether, given final-size household outbreak data containing mild and severe cases, it is possible to determine which of the two underlying explanations is causing the varying response. We conduct numerical studies which show that, given data on sufficiently many households, it is generally possible to discriminate between the two models by comparing the Kullback-Leibler divergences of the two fitted models from these data.
    Comment: 29 pages; submitted to Journal of Mathematical Biology
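    The discrimination step can be illustrated in miniature: given an empirical distribution of household outcomes (counts of mild and severe cases) and the outcome distributions implied by the two fitted models, pick the model closer in Kullback-Leibler divergence. The distributions below are invented toy numbers, not fits of the actual multitype or infector-dependent-severity models.

```python
import math

def kl_divergence(p, q):
    """D(p || q) over a common finite set of household outcomes,
    each outcome a (number mild, number severe) pair."""
    return sum(pi * math.log(pi / q[k]) for k, pi in p.items() if pi > 0)

def discriminate(empirical, model_a, model_b):
    """Return the label of the fitted model whose outcome distribution
    is closer (in KL divergence) to the empirical data."""
    d_a = kl_divergence(empirical, model_a)
    d_b = kl_divergence(empirical, model_b)
    return "A" if d_a < d_b else "B"
```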

    Extending DIRAC File Management with Erasure-Coding for efficient storage

    The state of the art in Grid-style data management is to achieve increased resilience of data via multiple complete replicas of data files across multiple storage endpoints. While this is effective, it is not the most space-efficient approach to resilience, especially when the reliability of individual storage endpoints is sufficiently high that only a few will be inactive at any point in time. We report on work performed as part of GridPP, extending the DIRAC File Catalogue and file management interface to allow the placement of erasure-coded files: each file is distributed as N identically-sized chunks of data striped across a vector of storage endpoints, encoded such that any M chunks can be lost and the original file can still be reconstructed. The tools developed are transparent to the user and, as well as allowing uploading and downloading of data to Grid storage, also provide the possibility of parallelising access across all of the distributed chunks at once, improving data transfer and IO performance. We expect this approach to be of most interest to smaller VOs, who have tighter bounds on the storage available to them, but larger (WLCG) VOs may be interested as their total data increases during Run 2. We provide an analysis of the costs and benefits of the approach, along with future development and implementation plans in this area. In general, the overhead of multiple file transfers is the largest issue for the competitiveness of this approach at present.
    Comment: 21st International Conference on Computing for High Energy and Nuclear Physics (CHEP2015)
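    The N-chunk, M-loss scheme can be illustrated in its simplest case. The DIRAC extension uses a general erasure code; the sketch below implements only M = 1 (N data chunks plus a single XOR parity chunk, tolerating the loss of any one chunk), with function names of our own choosing.

```python
def encode(data: bytes, n_data: int):
    """Split `data` into n_data equal-sized chunks plus one XOR parity
    chunk; returns (chunks, pad), where `pad` is the number of zero
    bytes appended so the chunks divide evenly."""
    pad = (-len(data)) % n_data
    padded = data + b"\x00" * pad
    size = len(padded) // n_data
    chunks = [padded[i * size:(i + 1) * size] for i in range(n_data)]
    parity = chunks[0]
    for c in chunks[1:]:
        parity = bytes(a ^ b for a, b in zip(parity, c))
    return chunks + [parity], pad

def reconstruct(chunks, lost):
    """Rebuild the chunk at index `lost` as the XOR of all surviving
    chunks; works for a data chunk and the parity chunk alike."""
    survivors = [c for i, c in enumerate(chunks) if i != lost]
    rebuilt = survivors[0]
    for c in survivors[1:]:
        rebuilt = bytes(a ^ b for a, b in zip(rebuilt, c))
    return rebuilt
```

    Tolerating the loss of any M > 1 chunks requires a proper erasure code such as Reed-Solomon; the XOR scheme is just the smallest instance of the same reconstruction idea.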

    NA62 Grid Monte Carlo Production Tools

    The NA62 Grid Interface for Monte Carlo production and its related system components are presented. This note is intended as a detailed description of the system for administration purposes, and as a user’s manual for Grid production management.

    Semantic Attention: Effects of Modality, Lexicality and Semantic Content

    Since the discovery of the Stroop Effect in 1935, questions about the role of language vs. non-lexical stimuli in selective attention remain. Early researchers attributed the powerful distracting influence shown in the Stroop task, naming the color in which a word is printed when it is incongruent with the color name the word spells, to an automaticity of language that gives it privileged access to meaning, but many others since have shown various ways to reduce or even reverse this distracting effect of an incongruent word. This study addresses the question using EEG to record neural activity, along with reaction time and accuracy, in a temporal flanker selective attention paradigm. The paradigm uses all combinations of visual and auditory modalities with word and non-word lexicality, as both flanking distractors and as targets, and manipulates attention using semantically congruent and incongruent trials, thus controlling for the effects of modality, lexicality and semantic congruence on the selective attention task of ignoring the flankers and discriminating the target. We found that in addition to strong main effects of each of these factors, many complex two- and three-way interaction effects shifted the effects of each factor depending on the levels of the other factors. We confirmed that semantic incongruence disrupts attention, shown by reduced performance accuracy and indexed by a stronger peak of the N2 ERP between 250 and 310 ms after target stimulus presentation. We found no support for the hypothesis that words have privileged automaticity, since stimulus lexicality, whether the distractor or target was a word or non-word, did not have a significant main effect on response accuracy. We found that the sensory modality of the distracting and target stimuli, whether auditory or visual, had complex interactions with their word or non-word lexicality that influenced the disrupting effects of semantic incongruence on attention.
    We propose a model, based on two well-established frameworks in the neuroscience of attention (multiple networks governing three stages of attention processing, and parallel multi-modal sensory processing bottlenecked by sequential language processing), to interpret these interacting effects on attention.
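    The N2 measure used above, peak amplitude in a 250-310 ms post-stimulus window, reduces to a window search over an averaged ERP trace. A minimal sketch, assuming (our assumption, not a description of the study's pipeline) that the trace is given as parallel lists of latencies in ms and amplitudes in microvolts:

```python
def n2_peak(times_ms, amplitudes, window=(250, 310)):
    """Return (latency, amplitude) of the N2 peak: the most negative
    deflection inside the post-stimulus window (N2 is a negative-going
    component, so a 'stronger' N2 means a more negative amplitude)."""
    lo, hi = window
    in_window = [(t, a) for t, a in zip(times_ms, amplitudes)
                 if lo <= t <= hi]
    return min(in_window, key=lambda ta: ta[1])
```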

    Analysis and improvement of data-set level file distribution in Disk Pool Manager

    Of the three most widely used implementations of the WLCG Storage Element specification, Disk Pool Manager[1, 2] (DPM) has the simplest implementation of file placement balancing (StoRM doesn't attempt this, leaving it to the underlying filesystem, which can be very sophisticated in itself). DPM uses a round-robin algorithm (with optional filesystem weighting) for placing files across filesystems and servers. This does a reasonable job of distributing files evenly across the storage array provided to it. However, it offers no guarantee of evenness for the subset of files associated with a given "dataset" (which often maps onto a "directory" in the DPM namespace, DPNS). It is useful to consider a concept of "balance", where an optimally balanced set of files is distributed evenly across all of the pool nodes. The best case for the round-robin algorithm is to maintain balance; it has no mechanism to improve it.

    In the past year or more, larger DPM sites have noticed load spikes on individual disk servers, and suspected that these were exacerbated by excesses of files from popular datasets on those servers. We present here a software tool which analyses file distribution for all datasets in a DPM SE, providing a measure of how poorly files are located in this sense. Further, the tool produces a list of file movements which would improve dataset-level file distribution, and can perform those movements itself. We present the results of such an analysis on the UKI-SCOTGRID-GLASGOW Production DPM.
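    The notion of "balance" above can be made concrete. A minimal sketch (the names, the spread metric, and the greedy strategy are our own illustration, not DPM's or the tool's internals): score a dataset by the spread between its fullest and emptiest filesystem, and generate a list of single-file moves that closes the gap.

```python
def dataset_spread(counts):
    """counts: files of one dataset per pool filesystem, with servers
    holding zero files included.  A spread of 0 or 1 means the dataset
    is as evenly distributed as it can be."""
    return max(counts.values()) - min(counts.values())

def plan_moves(counts):
    """Greedy list of (src, dst) single-file moves that rebalances a
    dataset: repeatedly move one file from the fullest filesystem to
    the emptiest until the spread is at most 1."""
    counts = dict(counts)
    moves = []
    while dataset_spread(counts) > 1:
        src = max(counts, key=lambda s: (counts[s], s))
        dst = min(counts, key=lambda s: (counts[s], s))
        counts[src] -= 1
        counts[dst] += 1
        moves.append((src, dst))
    return moves
```

    Round-robin placement keeps this spread small only if files of a dataset arrive interleaved with everything else; the move list is what a rebalancing pass would action.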

    Monitoring in a grid cluster

    The monitoring of a grid cluster (or of any piece of reasonably scaled IT infrastructure) is a key element in the robust and consistent running of that site. Several factors are important in selecting a useful monitoring framework, including ease of use, reliability, and data input and output. It is critical that data can be drawn from different instrumentation packages and collected in the framework to allow a uniform view of the running of a site. It is also very useful to allow different views and transformations of this data, so it can be manipulated for different purposes, perhaps unknown at the initial time of installation. In this context, we present the findings of an investigation of the Graphite monitoring framework and its use at the ScotGrid Glasgow site. In particular, we examine the messaging system used by the framework and the means of extracting data from different tools, including the existing Ganglia framework in use at many sites, in addition to adapting and parsing data streams from external monitoring frameworks and websites.
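    Feeding a metric into Graphite from any instrumentation source reduces to one line of Carbon's plaintext protocol, `<metric.path> <value> <unix-timestamp>\n`, sent to the carbon-cache TCP listener (port 2003 by default). A minimal sketch; the metric path is an invented example:

```python
import socket
import time

def carbon_line(path, value, timestamp=None):
    """Format one metric in Carbon's plaintext protocol."""
    ts = int(timestamp if timestamp is not None else time.time())
    return f"{path} {value} {ts}\n"

def send_metric(path, value, host="localhost", port=2003):
    """Send a single metric to a carbon-cache plaintext listener."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(carbon_line(path, value).encode("ascii"))
```

    Adapting an existing source such as Ganglia then amounts to parsing its output and re-emitting each reading as one such line.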